Adobe proposes a way to protect artists from AI rip-offs

As the engine powering the world’s digital artists, Adobe has a big responsibility to mitigate the rise of AI-driven deepfakes, misinformation, and content theft.

October 9, 2024

In the first quarter of 2025, Adobe is launching its Content Authenticity web app in beta, allowing creators to apply content credentials to their work, certifying it as their own.


This isn’t as simple as altering an image’s metadata, a protection too easily thwarted by screenshots. Content credentials take provenance a step further. Adobe’s system combines digital fingerprinting, invisible watermarking, and cryptographically signed metadata to protect a work more securely, whether it’s an image, video, or audio file.


Invisible watermarking alters pixels so minutely that the changes evade the human eye. The digital fingerprint works similarly, encoding an ID into the file itself, so that even if the content credentials are stripped away, the file can still be identified as belonging to its original creator.
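To make the layering concrete, here is a minimal Python sketch of the signed-metadata idea: a content hash standing in for a fingerprint, bound to a creator’s name and signed with a private key. This is an illustration of the general pattern, not Adobe’s actual scheme; a real system would use a robust perceptual fingerprint and standardized manifests rather than the toy claim format here.

```python
# Minimal sketch of fingerprint-plus-signed-metadata (illustrative only,
# not Adobe's implementation). Requires the third-party "cryptography" package.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import (
    Ed25519PrivateKey,
    Ed25519PublicKey,
)

def make_credential(asset: bytes, creator: str, key: Ed25519PrivateKey) -> dict:
    """Bind a creator's name to a hash of the file and sign the pair."""
    # SHA-256 is a stand-in: a production fingerprint would be perceptual,
    # surviving re-encoding, resizing, and screenshots.
    fingerprint = hashlib.sha256(asset).hexdigest()
    claim = json.dumps({"creator": creator, "fingerprint": fingerprint}).encode()
    return {"claim": claim, "signature": key.sign(claim)}

def verify_credential(cred: dict, public_key: Ed25519PublicKey) -> bool:
    """True if the claim was signed by the holder of the matching private key."""
    try:
        public_key.verify(cred["signature"], cred["claim"])
        return True
    except InvalidSignature:
        return False

key = Ed25519PrivateKey.generate()
cred = make_credential(b"raw image bytes", "Jane Artist", key)
assert verify_credential(cred, key.public_key())
```

Tampering with either the claim or the signature makes verification fail, which is what lets a viewer trust the answer to “who made this.”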


Adobe’s senior director of Content Authenticity, Andy Parsons, told TechCrunch that with this kind of technology, Adobe can “truly say that wherever an image, or a video, or an audio file goes, anywhere on the web or on a mobile device, the content credential will always be attached to it.”


Opt-in initiatives like this are only as strong as their adoption. But if any company can reach a critical mass of digital artists and creators, it’s Adobe, with 33 million paying subscribers to its software. And even artists who aren’t Adobe users can apply content credentials through the web app.


Then there’s the issue of making content credentials accessible across the internet. Adobe co-founded two industry groups that work to preserve content authenticity and bolster trust and transparency online: the Content Authenticity Initiative and the Coalition for Content Provenance and Authenticity (C2PA). Their membership includes camera manufacturers representing 90% of the market, content-creation tools from Microsoft and OpenAI, and platforms like TikTok, LinkedIn, Google, Instagram, and Facebook. Membership doesn’t mean these companies will integrate Adobe’s content credentials into their products, but it does mean Adobe has their ear.


Still, not all social media platforms and websites visibly display provenance information.



“In the meantime, to bridge that gap, we’re going to release the Content Authenticity browser extension for Chrome as part of this software package, and also something we call the Inspect tool within the Adobe Content Authenticity website,” Parsons said. “These will help you discover and display content credentials wherever they are associated with content anywhere on the web, and that can show you again who made the content, who gets credit for it.”
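Conceptually, an inspect flow like this is simple: find a credential attached to (or matched against) the asset, verify its signature, and surface the creator. Continuing the toy credential format from the earlier sketch (again an illustration of the flow, not the real manifest handling the extension performs):

```python
# Toy "inspect" flow, reusing make_credential/verify_credential, the json
# import, and the cred/key values from the earlier sketch. Real content
# credentials are standardized manifests embedded in or looked up for the
# file; only the shape of the flow is the same.
from typing import Optional

def inspect(cred: Optional[dict], public_key) -> str:
    """Report provenance for an asset, if a credential can be found."""
    if cred is None:
        return "No content credential found."
    if not verify_credential(cred, public_key):
        return "Credential found, but its signature does not verify."
    claim = json.loads(cred["claim"])
    return f"Made by {claim['creator']} (fingerprint {claim['fingerprint'][:12]}...)."

print(inspect(cred, key.public_key()))  # "Made by Jane Artist (fingerprint ...)."
```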


Ironically, AI is not very good at telling whether something is AI-generated. As it becomes harder to distinguish real images from synthetic ones, these tools offer a more concrete way to determine an image’s origin, so long as it carries credentials.


Adobe isn’t against the use of AI. Rather, the company is trying to make it clear when AI is used in an artwork, and to prevent artists’ work from being used in training datasets without consent. Adobe even has its own generative AI tool, Firefly, which is trained on Adobe Stock images.


“Firefly is commercially safe, and we only train it on content that Adobe explicitly has permission to use, and of course, never on customer content,” Parsons said.


Although artists have shown great resistance to AI tools, Parsons says Adobe’s Firefly integrations in apps like Photoshop and Lightroom have received positive feedback. Photoshop’s Generative Fill feature, which can extend images through prompting, was adopted at 10 times the rate of a typical Photoshop feature, Parsons said.


Adobe has also been working with Spawning, a startup building tools that help artists retain control over how their work is used online. Through its “Have I Been Trained?” website, Spawning lets artists search the most popular training datasets for their artworks. Artists can also add their works to a Do Not Train registry, which signals to AI companies that the work shouldn’t be included in training datasets. The registry is only effective if AI companies honor it, but so far, Hugging Face and Stability AI are on board.
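For illustration, here is a hedged sketch of how a dataset builder might consult such a registry before ingesting a work. The endpoint and response shape are invented placeholders; Spawning’s real API works differently.

```python
# Hypothetical do-not-train check before dataset ingestion. The registry URL
# and JSON response here are invented for illustration; they are not
# Spawning's actual API.
import json
import urllib.parse
import urllib.request

REGISTRY = "https://registry.example/opt-out"  # placeholder endpoint

def is_opted_out(work_url: str) -> bool:
    """Ask the (hypothetical) registry whether this work's creator opted out."""
    query = urllib.parse.urlencode({"url": work_url})
    with urllib.request.urlopen(f"{REGISTRY}?{query}") as resp:
        return json.load(resp).get("opted_out", False)

def filter_candidates(urls: list[str]) -> list[str]:
    """Drop opted-out works before they ever reach the training set."""
    return [u for u in urls if not is_opted_out(u)]
```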


On Tuesday, Adobe is launching the beta version of the Content Authenticity Chrome extension. Creators can also sign up to be notified when the beta for the full web app launches next year.
